Convergence of Alternating Least Squares Optimisation for Rank-One Approximation to High Order Tensors

Authors

  • Mike Espig
  • Aram Khachatryan
Abstract

The approximation of tensors has important applications in various disciplines, but it remains an extremely challenging task. It is well known that tensors of higher order can fail to have best low-rank approximations, with the important exception that a best rank-one approximation always exists. The most popular approach to low-rank approximation is the alternating least squares (ALS) method. This paper analyses the convergence of the ALS algorithm for the rank-one approximation problem, focusing on global convergence and on the rate of convergence. It is shown that the ALS method can converge sublinearly, Q-linearly, and even Q-superlinearly. The theoretical results are demonstrated on explicit examples.
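The ALS scheme studied here cycles through the factors of a rank-one tensor, solving a least-squares problem for one factor while the others are held fixed. The following is an illustrative sketch for a third-order tensor (function name and details are our own, not from the paper):

```python
import numpy as np

def als_rank_one(T, iters=200, seed=0):
    """Rank-one ALS sketch: approximate a third-order tensor T
    by lam * outer(a, b, c). Each sweep updates one factor in
    closed form while the other two are held fixed."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    a = rng.standard_normal(I); a /= np.linalg.norm(a)
    b = rng.standard_normal(J); b /= np.linalg.norm(b)
    c = rng.standard_normal(K); c /= np.linalg.norm(c)
    for _ in range(iters):
        # Mode-wise least-squares updates: contract T with the
        # two fixed factors, then renormalise.
        a = np.einsum('ijk,j,k->i', T, b, c); a /= np.linalg.norm(a)
        b = np.einsum('ijk,i,k->j', T, a, c); b /= np.linalg.norm(b)
        c = np.einsum('ijk,i,j->k', T, a, b); c /= np.linalg.norm(c)
    # Optimal scaling for the unit-norm factors.
    lam = np.einsum('ijk,i,j,k->', T, a, b, c)
    return lam, a, b, c
```

On an exactly rank-one input the iteration recovers the factors immediately; the paper's sublinear/Q-linear/Q-superlinear distinctions concern the behaviour on general tensors.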


Similar articles

On the Global Convergence of the Alternating Least Squares Method for Rank-One Approximation to Generic Tensors

Tensor decomposition has important applications in various disciplines, but it remains an extremely challenging task even to this date. A slightly more manageable endeavor has been to find a low rank approximation in place of the decomposition. Even for this less stringent undertaking, it is an established fact that tensors beyond matrices can fail to have best low rank approximations, with the...


Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence

With two notable exceptions, namely that tensors of order 2 (matrices) always have best approximations of arbitrary low rank and that tensors of any order always have a best rank-one approximation, it is known that high-order tensors may fail to have best low-rank approximations. When the condition of orthogonality is imposed, even under the modest assumption that only one set...


On best rank one approximation of tensors

Today, compact and reduced data representations using low-rank approximation are common for representing high-dimensional data sets in many application areas, such as genomics, multimedia, quantum chemistry, social networks, or visualization. In order to produce such low-rank data representations, the input data is typically approximated by so-called alternating least squares (ALS) algori...


On best rank one approximation of tensors (Technische Universität Berlin preprint)

In this paper we suggest a new algorithm for the computation of a best rank one approximation of tensors, called alternating singular value decomposition. This method is based on the computation of maximal singular values and the corresponding singular vectors of matrices. We also introduce a modification for this method and the alternating least squares method, which ensures that alternating i...
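The alternating singular value decomposition idea described above reduces each step to a matrix problem: fixing one factor and contracting the tensor with it leaves a matrix whose leading singular pair updates the other two factors at once. A hedged sketch of one such scheme (our own illustration, not the paper's exact algorithm):

```python
import numpy as np

def asvd_rank_one(T, iters=100, seed=0):
    """Alternating-SVD sketch for rank-one approximation of a
    third-order tensor: each half-step contracts T with one
    fixed factor and takes the leading singular pair of the
    resulting matrix as the other two factors."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    a = rng.standard_normal(I); a /= np.linalg.norm(a)
    for _ in range(iters):
        # Fix a: T contracted along mode 1 gives a J x K matrix.
        M = np.einsum('ijk,i->jk', T, a)
        U, s, Vt = np.linalg.svd(M)
        b, c = U[:, 0], Vt[0]          # leading singular pair
        # Fix c: T contracted along mode 3 gives an I x J matrix.
        M2 = np.einsum('ijk,k->ij', T, c)
        U2, s2, Vt2 = np.linalg.svd(M2)
        a, b = U2[:, 0], Vt2[0]
    lam = np.einsum('ijk,i,j,k->', T, a, b, c)
    return lam, a, b, c
```

Updating two factors per SVD is what distinguishes this from plain ALS, which updates only one factor per least-squares step.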


Iterative Methods for Symmetric Outer Product Tensor Decomposition

We study the symmetric outer product decomposition for tensors. Specifically, we look at the decomposition of a fully (partially) symmetric tensor into a sum of rank-one fully (partially) symmetric tensors. We present an iterative technique for third-order partially symmetric tensors and fourth-order fully and partially symmetric tensors. We include several numerical examples which indicate a faster convergence ...
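For the fully symmetric rank-one case, the alternating updates collapse to a single vector, giving the classical symmetric higher-order power iteration. A minimal sketch of that standard scheme (assumed for illustration; the paper's own iterative technique may differ):

```python
import numpy as np

def symmetric_rank_one(T, iters=200, seed=0):
    """Symmetric higher-order power iteration sketch: for a
    symmetric third-order tensor T, one vector x is updated by
    contracting T along all but one mode, then renormalising."""
    rng = np.random.default_rng(seed)
    n = T.shape[0]
    x = rng.standard_normal(n); x /= np.linalg.norm(x)
    for _ in range(iters):
        x = np.einsum('ijk,j,k->i', T, x, x)
        x /= np.linalg.norm(x)
    # Scaling for the unit-norm symmetric factor.
    lam = np.einsum('ijk,i,j,k->', T, x, x, x)
    return lam, x
```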




Journal:

Volume   Issue

Pages   -

Published 2014